17 research outputs found

    Investigating benchmark correlations when comparing algorithms with parameter tuning: detailed experiments and results.

    Benchmarks are important for demonstrating the utility of optimisation algorithms, but the practice of benchmarking is controversial: we can select instances that present our algorithm favourably and dismiss those on which it underperforms. Several papers highlight the pitfalls of benchmarking, some in the context of the automated design of algorithms, where a set of problem instances (benchmarks) is used to train an algorithm. As in machine learning, if the training set does not reflect the test set, the algorithm will not generalise. This raises open questions about the use of test instances to automatically design algorithms. We use differential evolution, sweeping its parameter settings, to investigate the practice of benchmarking on the BBOB benchmarks. We make three key findings. Firstly, several benchmark functions are highly correlated. This may lead to the false conclusion that an algorithm performs well in general when it performs poorly on a few key instances, possibly introducing unwanted bias into a resulting automatically designed algorithm. Secondly, the number of evaluations can have a large effect on the conclusions drawn. Finally, a systematic sweep of the parameters shows how performance varies over time across the space of algorithm configurations. The datasets, including all computed features, the evolved policies and their performances, and the visualisations for all feature sets, are available from the University of Stirling Data Repository (http://hdl.handle.net/11667/109).
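
    As a minimal sketch of the correlation analysis described above (not the authors' code), the following assumes a matrix of best fitness values, one row per sampled algorithm configuration and one column per BBOB function, and flags highly correlated function pairs; the random `perf` placeholder stands in for real benchmark results.

    ```python
    # Rank-correlate benchmark functions across a sweep of configurations.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_configs, n_funcs = 100, 24             # e.g. a DE sweep over 24 BBOB functions
    perf = rng.random((n_configs, n_funcs))  # placeholder for real results

    # rho[i, j] is the Spearman correlation between functions i and j,
    # taken over the sampled algorithm configurations.
    rho, _ = spearmanr(perf)

    # Highly correlated pairs add little independent signal to the benchmark set.
    for i in range(n_funcs):
        for j in range(i + 1, n_funcs):
            if abs(rho[i, j]) > 0.9:
                print(f"f{i + 1} and f{j + 1}: rho = {rho[i, j]:.2f}")
    ```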

    Investigating benchmark correlations when comparing algorithms with parameter tuning.

    Benchmarks are important for comparing the performance of optimisation algorithms, but we can select instances that present our algorithm favourably and dismiss those on which it underperforms. Also relevant is the automated design of algorithms, which uses problem instances (benchmarks) to train an algorithm: careful choice of instances is needed for the algorithm to generalise. We sweep the parameter settings of differential evolution applied to the BBOB benchmarks. Several benchmark functions are highly correlated. This may lead to the false conclusion that an algorithm performs well in general when it performs poorly on a few key instances. These correlations vary with the number of evaluations.
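
    The parameter sweep itself might look like the sketch below, which uses SciPy's `differential_evolution` and the sphere function purely as stand-ins for the paper's DE implementation and the BBOB instances; the grid of mutation and crossover values is illustrative.

    ```python
    # Sweep DE's mutation factor F and crossover rate CR over a grid.
    import itertools
    import numpy as np
    from scipy.optimize import differential_evolution

    def sphere(x):
        return float(np.sum(x ** 2))

    bounds = [(-5.0, 5.0)] * 10
    results = {}
    for F, CR in itertools.product([0.3, 0.5, 0.7, 0.9], [0.1, 0.5, 0.9]):
        res = differential_evolution(sphere, bounds, mutation=F,
                                     recombination=CR, maxiter=100,
                                     popsize=20, tol=0, seed=1)
        results[(F, CR)] = res.fun    # best fitness for this configuration

    for (F, CR), fbest in sorted(results.items()):
        print(f"F={F:.1f} CR={CR:.1f} -> best={fbest:.3e}")
    ```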

    The Markov network fitness model

    Fitness modelling is an area of research which has recently received much interest in the evolutionary computing community. Fitness models can improve the efficiency of optimisation through direct sampling to generate new solutions, through the guiding of traditional genetic operators, or as surrogates for noisy or long-running fitness functions. In this chapter we discuss the application of Markov networks to fitness modelling of black-box functions within evolutionary computation, accompanied by a discussion of the relationship between Markov networks and Walsh analysis of fitness functions. We review alternative fitness modelling and approximation techniques and draw comparisons with the Markov network approach. We discuss the applicability of Markov networks as fitness surrogates which may be used for constructing guided operators or more general hybrid algorithms. We conclude with some observations and issues arising from work conducted in this area so far.
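
    To illustrate the Walsh analysis the chapter relates to Markov networks, this sketch (not the chapter's code) computes the Walsh coefficients of a small bit-string function with a fast Walsh-Hadamard transform; the example function `f` is invented for the demonstration.

    ```python
    # Compute Walsh coefficients of a function over {0,1}^n.
    import itertools
    import numpy as np

    def walsh_coefficients(f, n):
        """Walsh coefficients of f, indexed by bit masks over the variables."""
        table = np.array([f(bits) for bits in
                          itertools.product([0, 1], repeat=n)], dtype=float)
        h = 1
        while h < len(table):          # in-place fast Walsh-Hadamard transform
            for i in range(0, len(table), 2 * h):
                a = table[i:i + h].copy()
                b = table[i + h:i + 2 * h].copy()
                table[i:i + h] = a + b
                table[i + h:i + 2 * h] = a - b
            h *= 2
        return table / len(table)

    # A 3-bit function with a pairwise interaction between x0 and x1.
    f = lambda x: x[0] + x[1] + 2 * x[0] * x[1] + x[2]
    for mask, w in enumerate(walsh_coefficients(f, 3)):
        if abs(w) > 1e-9:
            print(f"subset {mask:03b}: coefficient {w:+.3f}")
    ```

    The non-zero coefficients identify exactly which variable interactions the fitness function contains, which is the structure a Markov network fitness model aims to capture.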

    A chance-constrained programming model for airport ground movement optimisation with taxi time uncertainties

    Airport ground movement remains a major bottleneck for air traffic management. Existing approaches have developed several route allocation methods to address this problem, in which the taxi time for traversing each segment of the taxiways is fixed. However, taxi time is typically difficult to estimate in advance, since uncertainty is inherent in airport ground movement optimisation due to various unmodelled and unpredictable factors. To address the optimisation of taxi time under uncertainty, we introduce a chance-constrained programming model with sample approximation, in which a set of scenarios is generated in accordance with the taxi time distributions. A modified sequential quickest path searching algorithm with a local heuristic is then designed to minimise the total taxi time. Working with real-world data from an international airport, we compare our proposed method with state-of-the-art algorithms. Extensive simulations indicate that our proposed method efficiently allocates routes with shorter taxi times, as well as fewer aircraft stops during the taxiing process.
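
    The sample approximation of a chance constraint can be sketched as follows; the normal taxi time distributions, segment values, deadline and risk level `epsilon` are invented placeholders rather than the paper's data or model.

    ```python
    # Empirical check of P(route time <= deadline) >= 1 - epsilon via scenarios.
    import numpy as np

    rng = np.random.default_rng(42)
    seg_mean = np.array([60.0, 45.0, 90.0])  # mean taxi time per segment (s)
    seg_std = np.array([10.0, 5.0, 20.0])    # its uncertainty
    n_scenarios, deadline, epsilon = 1000, 230.0, 0.05

    # Sample scenarios from the segment taxi time distributions.
    scenarios = rng.normal(seg_mean, seg_std, size=(n_scenarios, len(seg_mean)))
    route_times = scenarios.sum(axis=1)

    # The chance constraint is replaced by its empirical counterpart over
    # the sampled scenarios.
    violations = np.mean(route_times > deadline)
    print(f"violation rate {violations:.3f}; feasible: {violations <= epsilon}")
    ```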

    A multi-objective window optimisation problem

    We present an optimisation problem which seeks to locate the Pareto-optimal front of building window and shading designs minimising two objectives: the projected energy use of the operational building and its construction cost. This problem is of particular interest because it has many variable interactions and each function evaluation is relatively time-consuming. It also makes use of the freely available building simulation program EnergyPlus, which may be used in many other building design optimisation problems. We describe the problem and report the results of experiments comparing the performance of a number of existing multi-objective evolutionary algorithms applied to it. We conclude that this represents a promising real-world application area.
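
    A minimal non-dominated filter over the two objectives (projected energy use and construction cost, both minimised) might look as follows; the candidate designs are placeholders for EnergyPlus evaluations.

    ```python
    # Extract the Pareto front from a set of bi-objective evaluations.
    import numpy as np

    def pareto_front(points):
        """Return the non-dominated rows of an (n, 2) objective array."""
        keep = []
        for i, p in enumerate(points):
            dominated = any(np.all(q <= p) and np.any(q < p)
                            for j, q in enumerate(points) if j != i)
            if not dominated:
                keep.append(i)
        return points[keep]

    designs = np.array([[120.0, 8000.0],   # (annual energy, cost) placeholders
                        [100.0, 9500.0],
                        [100.0, 9000.0],
                        [90.0, 15000.0],
                        [130.0, 7000.0]])
    print(pareto_front(designs))
    ```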

    Towards explainable metaheuristics: feature extraction from trajectory mining.

    Explaining the decisions made by population-based metaheuristics can often be considered difficult due to the stochastic nature of the mechanisms employed by these optimisation methods. As industries continue to adopt these methods in areas that increasingly require end-user input and confirmation, the need to explain the internal decisions being made has grown. In this article, we present our approach to the extraction of explanation-supporting features using trajectory mining. This is achieved through the application of principal component analysis techniques to identify new methods of tracking population diversity changes post-runtime. The algorithm search trajectories were generated by solving a set of benchmark problems with a genetic algorithm and a univariate estimation of distribution algorithm, retaining all visited candidate solutions, which were then projected to a lower-dimensional sub-space. We also varied the selection pressure placed on high-fitness solutions by altering the selection operators. Our results show that metrics derived from the projected sub-space algorithm search trajectories are capable of capturing key learning steps, and that solution variable patterns that explain the fitness function may be captured in the principal component coefficients. A comparative study of variable importance rankings derived from a surrogate model built on the same dataset was also performed. The results show that both approaches are capable of identifying key features regarding variable interactions and their influence on fitness in a complementary fashion.
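
    The trajectory-mining step could be sketched as below, assuming visited candidate solutions are logged per generation: PCA projects them to a two-dimensional sub-space, per-generation diversity is tracked as dispersion around the population centroid, and the principal component coefficients indicate which variables drive the trajectory. The synthetic contracting population stands in for a real GA or EDA run.

    ```python
    # Project a search trajectory to PC space and track diversity over time.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    n_gens, pop_size, n_vars = 50, 40, 30
    # Placeholder trajectory: a population contracting towards an optimum.
    trajectory = np.array([rng.normal(0, 1.0 / (g + 1), size=(pop_size, n_vars))
                           for g in range(n_gens)])

    pca = PCA(n_components=2)
    projected = pca.fit_transform(trajectory.reshape(-1, n_vars))
    projected = projected.reshape(n_gens, pop_size, 2)

    # Diversity per generation: mean distance to the centroid in PC space.
    diversity = np.linalg.norm(projected - projected.mean(axis=1, keepdims=True),
                               axis=2).mean(axis=1)
    print(diversity[:5], "...", diversity[-5:])
    # The PC coefficients relate sub-space directions back to the variables.
    print(pca.components_)
    ```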

    Multi-dwelling refurbishment optimization: problem decomposition, solution and trade-off analysis

    A methodology has been developed for the multi-objective optimization of the refurbishment of domestic building stock on a regional scale. The approach is based on the decomposition of the problem into two stages: first finding the energy-cost trade-off for individual houses, and then applying it to multiple houses. The approach has been applied to 759 dwellings using buildings data from a survey of the UK housing stock. The energy use of each building and its refurbished variants was simulated in EnergyPlus using automatically generated input files. The variation in the contributing refurbishment options from least to highest cost along the Pareto front shows loft and cavity wall insulation to be optimal initially, with solid wall insulation and double glazing appearing later.
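
    One plausible reading of the second stage (a sketch, not necessarily the paper's exact method) is a greedy allocation across the stock by marginal energy saving per unit cost, given each house's precomputed energy-cost front; the fronts and budget below are invented for illustration.

    ```python
    # Allocate refurbishment spend across houses by best marginal saving.
    import heapq

    # house -> Pareto front as (cost, annual energy) pairs, sorted by cost.
    fronts = {
        "house_a": [(0, 12000), (500, 10500), (3000, 9000)],
        "house_b": [(0, 9000), (400, 8200), (5000, 6500)],
    }

    budget, spent = 3500, 0
    state = {h: 0 for h in fronts}   # current position on each house's front
    heap = []
    for h, f in fronts.items():
        rate = (f[0][1] - f[1][1]) / (f[1][0] - f[0][0])
        heapq.heappush(heap, (-rate, h))   # best saving per pound first

    while heap:
        _, h = heapq.heappop(heap)
        i, f = state[h], fronts[h]
        step_cost = f[i + 1][0] - f[i][0]
        if spent + step_cost > budget:
            continue
        spent += step_cost
        state[h] = i + 1
        if i + 2 < len(f):           # queue this house's next upgrade step
            rate = (f[i + 1][1] - f[i + 2][1]) / (f[i + 2][0] - f[i + 1][0])
            heapq.heappush(heap, (-rate, h))

    print(spent, {h: fronts[h][i] for h, i in state.items()})
    ```

    The greedy rule is only exact when each house's front exhibits diminishing returns, which is a reasonable shape for insulation-style measures but an assumption nonetheless.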

    The intersection of evolutionary computation and explainable AI.

    In the past decade, Explainable Artificial Intelligence (XAI) has attracted great interest in the research community, motivated by the need for explanations in critical AI applications. Some recent advances in XAI are based on Evolutionary Computation (EC) techniques, such as Genetic Programming. We call this trend EC for XAI. We argue that the full potential of EC methods has not yet been exploited in XAI, and we call on the community to pursue future efforts in this field. Likewise, we find that there is a growing concern in EC regarding the explanation of population-based methods, i.e., their search process and outcomes. While some attempts have been made in this direction (although, in most cases, not explicitly put in the context of XAI), we believe that there are still several research opportunities and open questions that, in principle, may promote a safer and broader adoption of EC in real-world applications. We call this trend XAI within EC. In this position paper, we briefly overview the main results in the two above trends and suggest that the EC community may play a major role in the achievement of XAI.

    Generating easy and hard problems using the proximate optimality principle.

    We present an approach to generating problems of variable difficulty based on the well-known Proximate Optimality Principle (POP), often paraphrased as 'similar solutions have similar fitness'. We explore definitions of this concept in terms of metrics in objective space and in representation space, and define POP in terms of the coherence of these metrics. We hypothesise that algorithms will perform well when the neighbourhoods they explore in representation space are coherent with the natural metric induced by fitness on objective space. We develop an explicit method of problem generation which creates bit-string problems where the natural fitness metric is coherent or anti-coherent with Hamming neighbourhoods. We conduct experiments to show that coherent problems are easy, whereas anti-coherent problems are hard, for local hill climbers using Hamming neighbourhoods.
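
    A toy construction in this spirit (an assumption-laden sketch, not the paper's generator) makes a coherent problem whose fitness tracks Hamming distance to a hidden optimum, and an anti-coherent one that scrambles solutions with a random permutation before scoring, then compares a simple hill climber on both.

    ```python
    # Coherent vs anti-coherent bit-string problems under a hill climber.
    import itertools
    import random

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def make_problems(n, seed=0):
        rng = random.Random(seed)
        opt = tuple(rng.randint(0, 1) for _ in range(n))
        coherent = lambda x: n - hamming(x, opt)
        # Anti-coherent: relabel points by a random permutation of bit strings
        # so that Hamming neighbours no longer have similar fitness.
        strings = list(itertools.product([0, 1], repeat=n))
        perm = dict(zip(strings, rng.sample(strings, len(strings))))
        anti = lambda x: n - hamming(perm[tuple(x)], opt)
        return coherent, anti

    def hill_climb(f, n, iters=200, seed=1):
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n)]
        for _ in range(iters):
            y = list(x)
            y[rng.randrange(n)] ^= 1   # flip one random bit
            if f(y) >= f(x):
                x = y
        return f(x)

    coherent, anti = make_problems(10)
    print("coherent best:", hill_climb(coherent, 10))
    print("anti-coherent best:", hill_climb(anti, 10))
    ```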

    Structural coherence of problem and algorithm: an analysis for EDAs on all 2-bit and 3-bit problems.

    Metaheuristics assume some kind of coherence between decision and objective spaces. Estimation of Distribution Algorithms (EDAs) approach this by constructing an explicit probabilistic model of high-fitness solutions, the structure of which is intended to reflect the structure of the problem. In this context, 'structure' means the dependencies or interactions between problem variables in a probabilistic graphical model. There are many approaches to discovering these dependencies, and existing work has already shown that these approaches often discover 'unnecessary' elements of structure - that is, elements which are not needed to correctly rank solutions. This work performs an exhaustive analysis of all 2-bit and 3-bit problems, grouped into classes based on monotonic invariance. It is shown in [1] that each class has a minimal Walsh structure that can be used to solve the problem. We compare the structure discovered by different structure learning approaches to the minimal Walsh structure for each class, with summaries of which interactions are (in)correctly identified. Our analysis reveals a large number of symmetries that may be used to simplify problem solving. We show that negative selection can result in improved coherence between discovered and necessary structure, and we conclude with some directions for a general programme of study building on this work.
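
    As a toy sketch of structure discovery (all values invented, and a much simpler detector than the approaches compared in the paper), pairwise dependencies can be estimated by mutual information within populations obtained by positive (truncation) and negative selection, and compared against a 3-bit function whose only necessary interaction is between x0 and x1.

    ```python
    # Estimate pairwise dependencies in selected populations.
    import itertools
    import numpy as np

    def mutual_info(samples, i, j):
        """Mutual information between bits i and j of a 0/1 sample array."""
        mi = 0.0
        for a, b in itertools.product([0, 1], repeat=2):
            pab = np.mean((samples[:, i] == a) & (samples[:, j] == b))
            pa = np.mean(samples[:, i] == a)
            pb = np.mean(samples[:, j] == b)
            if pab > 0:
                mi += pab * np.log(pab / (pa * pb))
        return mi

    rng = np.random.default_rng(3)
    pop = rng.integers(0, 2, size=(2000, 3))
    f = pop[:, 0] ^ pop[:, 1]        # fitness needs only the (x0, x1) interaction
    order = np.argsort(f)
    best, worst = pop[order[-500:]], pop[order[:500]]   # positive / negative

    for i, j in itertools.combinations(range(3), 2):
        print(f"x{i}-x{j}: top {mutual_info(best, i, j):.3f}, "
              f"bottom {mutual_info(worst, i, j):.3f}")
    ```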